Metric Learning via Maximizing the Lipschitz Margin Ratio

Authors

  • Mingzhi Dong
  • Xiaochen Yang
  • Yang Wu
  • Jing-Hao Xue
Abstract

In this paper, we propose the Lipschitz margin ratio and a new metric learning framework for classification that maximizes this ratio. The framework integrates both the inter-class margin and the intra-class dispersion, and thereby enhances the generalization ability of a classifier. To introduce the Lipschitz margin ratio and its associated learning bound, we elaborate on the relationship between metric learning and Lipschitz functions, as well as the representability and learnability of Lipschitz functions. After proposing the new metric learning framework based on the Lipschitz margin ratio, we also prove that several well-known metric learning algorithms can be viewed as special cases of the proposed framework. In addition, we illustrate the framework by implementing it for learning the squared Mahalanobis metric, and by demonstrating its encouraging results on eight popular machine-learning datasets.
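As a point of reference (an illustrative sketch under standard conventions, not the paper's exact formulation), the squared Mahalanobis metric mentioned above is parameterized by a positive semi-definite matrix M,

d_M^2(\mathbf{x}, \mathbf{x}') = (\mathbf{x} - \mathbf{x}')^\top M (\mathbf{x} - \mathbf{x}'), \qquad M \succeq 0,

and a margin-ratio criterion of the kind described would, loosely, take the form

\max_{M \succeq 0} \; \frac{\gamma(M)}{D(M)},

where \gamma(M) stands for an inter-class margin and D(M) for an intra-class dispersion measure under d_M; the precise definition of the Lipschitz margin ratio and its learning bound are given in the paper itself.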


Related Articles

Distance-Based Classification with Lipschitz Functions

The goal of this article is to develop a framework for large margin classification in metric spaces. We want to find a generalization of linear decision functions for metric spaces and define a corresponding notion of margin such that the decision function separates the training points with a large margin. It will turn out that using Lipschitz functions as decision functions, the inverse of the...
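For background (a standard definition, not a quotation from the article), a real-valued function f on a metric space (X, d) is Lipschitz with constant

L(f) = \sup_{x \neq x'} \frac{|f(x) - f(x')|}{d(x, x')},

and in this line of work the margin achieved by a Lipschitz decision function is typically tied to the reciprocal of this constant.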


Multi-granularity distance metric learning via neighborhood granule margin maximization

Learning a distance metric from training samples is often a crucial step in machine learning and pattern recognition. Locality, compactness and consistency are considered the key principles of distance metric learning. However, existing metric learning methods consider only one or two of them. In this paper, we develop a multi-granularity distance learning technique. First, a new index, ...


POINT DERIVATIONS ON BANACH ALGEBRAS OF α-LIPSCHITZ VECTOR-VALUED OPERATORS

Lipschitz function algebras were first defined in the 1960s by several mathematicians, including Schubert. Initially, real-valued and complex-valued Lipschitz functions were defined and the quantitative properties of these algebras were investigated. Over time these algebras have been studied and generalized by many mathematicians, such as Cao, Zhang, Xu, Weaver, and others. Let  be a non-emp...


Weighted Composition Operators Between Extended Lipschitz Algebras on Compact Metric Spaces

In this paper, we provide a complete description of weighted composition operators between extended Lipschitz algebras on compact metric spaces. We give necessary and sufficient conditions for the injectivity and the surjectivity of these operators. We also obtain some sufficient conditions and some necessary conditions for a weighted composition operator between these spaces to be compact.


Ordinal margin metric learning and its extension for cross-distribution image data

In the fields of machine learning and computer vision, a wide range of applications, such as human age estimation and head pose recognition, involve ordinal data in which an order relationship exists. To perform such ordinal estimations in a desired metric space, in this work we first propose a novel ordinal margin metric learning (ORMML) method by separating the data classes with a seque...



Journal title:
  • CoRR

Volume abs/1802.03464, Issue –

Pages –

Publication date: 2018